
    A Feature-Based Comparison of Evolutionary Computing Techniques for Constrained Continuous Optimisation

    Evolutionary algorithms have been frequently applied to constrained continuous optimisation problems. We carry out feature-based comparisons of different types of evolutionary algorithms, such as evolution strategies, differential evolution and particle swarm optimisation, for constrained continuous optimisation. In our study, we examine how sets of constraints influence the difficulty of obtaining close-to-optimal solutions. Using a multi-objective approach, we evolve constrained continuous problems having a set of linear and/or quadratic constraints on which the different evolutionary approaches show a significant difference in performance. Afterwards, we discuss the features of the constraints that exhibit a difference in performance of the different evolutionary approaches under consideration. Comment: 16 pages, 2 figures
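    For intuition, the difficulty induced by a constraint set can be probed through its violation measure. The sketch below (hypothetical helper names, not the paper's code) scores a candidate point against linear constraints a·x ≤ b and quadratic constraints xᵀQx + a·x ≤ b, the two constraint types the study evolves:

    ```python
    import numpy as np

    def violation(x, linear, quadratic):
        """Total constraint violation of point x; 0 means feasible.

        linear:    list of (a, b) encoding a . x <= b
        quadratic: list of (Q, a, b) encoding x . Q . x + a . x <= b
        """
        v = 0.0
        for a, b in linear:
            v += max(0.0, float(a @ x) - b)
        for Q, a, b in quadratic:
            v += max(0.0, float(x @ Q @ x + a @ x) - b)
        return v

    x = np.array([1.0, 2.0])
    lin = [(np.array([1.0, 1.0]), 2.0)]        # x1 + x2 <= 2, violated by 1
    quad = [(np.eye(2), np.zeros(2), 10.0)]    # x1^2 + x2^2 <= 10, satisfied
    print(violation(x, lin, quad))             # 1.0
    ```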

    Pippi - painless parsing, post-processing and plotting of posterior and likelihood samples

    Interpreting samples from likelihood or posterior probability density functions is rarely as straightforward as it seems it should be. Producing publication-quality graphics of these distributions is often similarly painful. In this short note I describe pippi, a simple, publicly-available package for parsing and post-processing such samples, as well as generating high-quality PDF graphics of the results. Pippi is easily and extensively configurable and customisable, both in its options for parsing and post-processing samples, and in the visual aspects of the figures it produces. I illustrate some of these using an existing supersymmetric global fit, performed in the context of a gamma-ray search for dark matter. Pippi can be downloaded and followed at http://github.com/patscott/pippi. Comment: 4 pages, 1 figure. v3: Updated for pippi 2.0. New features include hdf5 support, out-of-core processing, inline post-processing with arbitrary Python code in the input file, and observable-specific data cuts. Pippi can be downloaded from http://github.com/patscott/pippi

    Experimental Comparisons of Derivative Free Optimization Algorithms

    In this paper, the performances of the quasi-Newton BFGS algorithm, the NEWUOA derivative-free optimizer, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), the Differential Evolution (DE) algorithm and Particle Swarm Optimizers (PSO) are compared experimentally on benchmark functions reflecting important challenges encountered in real-world optimization problems. The dependence of performance on the conditioning of the problem, and the rotational invariance of the algorithms, are investigated in particular. Comment: 8th International Symposium on Experimental Algorithms, Dortmund, Germany (2009)
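    The two properties probed here, conditioning and rotational invariance, are easy to materialise in a benchmark. A minimal sketch (not the paper's benchmark suite): an ellipsoid whose condition number is a parameter, composed with a random rotation; a rotation-invariant algorithm should behave identically on f(x) and f(Rx).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def random_rotation(n):
        # QR factorisation of a Gaussian matrix yields a random orthogonal matrix
        q, r = np.linalg.qr(rng.standard_normal((n, n)))
        return q * np.sign(np.diag(r))

    def ellipsoid(x, cond=1e6):
        """Convex quadratic with axis scalings spanning the condition number."""
        n = len(x)
        scales = cond ** (np.arange(n) / (n - 1))
        return float(np.sum(scales * x ** 2))

    n = 5
    R = random_rotation(n)
    x = rng.standard_normal(n)
    # Separable variant vs. rotated (non-separable) variant of the same landscape
    print(ellipsoid(x), ellipsoid(R @ x))
    ```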

    A hybrid multiagent approach for global trajectory optimization

    In this paper we consider a global optimization method for space trajectory design problems. The method, which actually aims at finding not only the global minimizer but a whole set of low-lying local minimizers (corresponding to a set of different design options), is based on a domain decomposition technique where each subdomain is evaluated through a procedure based on the evolution of a population of agents. The method is applied to two space trajectory design problems and compared with existing deterministic and stochastic global optimization methods.

    Trace and detect adversarial attacks on CNNs using feature response maps

    The existence of adversarial attacks on convolutional neural networks (CNN) questions the fitness of such models for serious applications. The attacks manipulate an input image such that misclassification is evoked while still looking normal to a human observer – they are thus not easily detectable. In a different context, backpropagated activations of CNN hidden layers – “feature responses” to a given input – have been helpful in visualizing for a human “debugger” what the CNN “looks at” while computing its output. In this work, we propose a novel detection method for adversarial examples to prevent attacks. We do so by tracking adversarial perturbations in feature responses, allowing for automatic detection using average local spatial entropy. The method does not alter the original network architecture and is fully human-interpretable. Experiments confirm the validity of our approach for state-of-the-art attacks on large-scale models trained on ImageNet.
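    The detection statistic named in the abstract, average local spatial entropy, can be sketched as follows. This is an illustrative reading (patch size, bin count and histogram range are assumptions, not the paper's settings): tile a 2-D feature response into patches, compute the Shannon entropy of each patch's value histogram, and average.

    ```python
    import numpy as np

    def local_entropy(patch, bins=8):
        # Shannon entropy (bits) of the value histogram within one patch
        hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def average_local_spatial_entropy(fmap, patch=4):
        """Mean entropy over non-overlapping patches of a 2-D feature response."""
        h, w = fmap.shape
        vals = [local_entropy(fmap[i:i + patch, j:j + patch])
                for i in range(0, h - patch + 1, patch)
                for j in range(0, w - patch + 1, patch)]
        return float(np.mean(vals))

    rng = np.random.default_rng(1)
    uniform = np.full((16, 16), 0.5)   # constant response -> zero entropy
    noisy = rng.random((16, 16))       # noise-like response -> high entropy
    print(average_local_spatial_entropy(uniform))  # 0.0
    print(average_local_spatial_entropy(noisy))
    ```

    The intuition, under this reading, is that adversarial perturbations raise the entropy of otherwise smooth feature responses, making the statistic a cheap anomaly score.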

    Differential evolution for the offline and online optimization of fed-batch fermentation processes

    The optimization of input variables (typically feeding trajectories over time) in fed-batch fermentations has gained special attention, given the economic impact and the complexity of the problem. Evolutionary Computation (EC) has been a source of algorithms that have shown good performance in this task. In this chapter, Differential Evolution (DE) is proposed to tackle this problem and quite promising results are shown. DE is tested in several real-world case studies and compared with other EC algorithms, such as Evolutionary Algorithms and Particle Swarms. Furthermore, DE is also proposed as an alternative to perform online optimization, where the input variables are adjusted while the real fermentation process is ongoing. In this case, a changing landscape is optimized, making the task of the algorithms more difficult. However, that fact does not impair the performance of the DE and confirms its good behaviour.
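    For readers unfamiliar with DE, the canonical DE/rand/1/bin scheme referenced throughout such comparisons fits in a few lines. This is a generic textbook sketch on box constraints, not the chapter's implementation:

    ```python
    import numpy as np

    def de(f, bounds, pop_size=20, F=0.8, CR=0.9, iters=200, seed=0):
        """Minimal DE/rand/1/bin minimiser on box constraints (lo, hi)."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        dim = len(lo)
        pop = rng.uniform(lo, hi, (pop_size, dim))
        fit = np.array([f(x) for x in pop])
        for _ in range(iters):
            for i in range(pop_size):
                # three distinct donors, none equal to the target index i
                a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                         3, replace=False)]
                mutant = np.clip(a + F * (b - c), lo, hi)
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True      # force at least one gene
                trial = np.where(cross, mutant, pop[i])
                ft = f(trial)
                if ft <= fit[i]:                     # greedy one-to-one selection
                    pop[i], fit[i] = trial, ft
        best = int(np.argmin(fit))
        return pop[best], fit[best]

    sphere = lambda x: float(np.sum(x ** 2))
    x, fx = de(sphere, (np.full(3, -5.0), np.full(3, 5.0)))
    print(fx)   # close to 0
    ```

    Online use, as described above, amounts to rerunning (or continuing) this loop as the objective drifts with the real process.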

    From feature selection to continuous optimization

    Metaheuristic algorithms (MAs) have seen unprecedented growth thanks to their successful applications in fields including engineering and health sciences. In this work, we investigate the use of a deep learning (DL) model as an alternative tool for continuous optimisation. The proposed method, called MaNet, is motivated by the fact that most DL models need to solve massive, nasty optimization problems consisting of millions of parameters. Feature selection is the main concept adopted in MaNet; it helps the algorithm skip irrelevant or partially relevant evolutionary information and use the information that contributes most to overall performance. The introduced model is applied to several unimodal and multimodal continuous problems. The experiments indicate that MaNet is able to yield competitive results compared to one of the best hand-designed algorithms for the aforementioned problems, in terms of solution accuracy and scalability. Comment: Accepted for EA201

    SQG-Differential Evolution for difficult optimization problems under a tight function evaluation budget

    In the context of industrial engineering, it is important to integrate efficient computational optimization methods in the product development process. Some of the most challenging simulation-based engineering design optimization problems are characterized by: a large number of design variables, the absence of analytical gradients, highly non-linear objectives and a limited function evaluation budget. Although a huge variety of different optimization algorithms is available, the development and selection of efficient algorithms for problems with these industrially relevant characteristics remains a challenge. In this communication, a hybrid variant of Differential Evolution (DE) is introduced which combines aspects of Stochastic Quasi-Gradient (SQG) methods within the framework of DE, in order to improve optimization efficiency on problems with the previously mentioned characteristics. The performance of the resulting derivative-free algorithm is compared with other state-of-the-art DE variants on 25 commonly used benchmark functions, under a tight function evaluation budget of 1000 evaluations. The experimental results indicate that the new algorithm performs excellently on the 'difficult' (high-dimensional, multi-modal, inseparable) test functions. The operations used in the proposed mutation scheme are computationally inexpensive, and can be easily implemented in existing differential evolution variants or other population-based optimization algorithms by a few lines of program code as a non-invasive optional setting. Besides the applicability of the presented algorithm by itself, the described concepts can serve as a useful and interesting addition to the algorithmic operators in the frameworks of heuristics and evolutionary optimization and computing.
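    One plausible reading of an SQG-flavoured mutation, sketched here purely for illustration (the function names, the pairwise finite-difference estimator and the weighting w are all assumptions, not the paper's scheme): estimate a quasi-gradient from fitness differences between population members, at zero extra function-evaluation cost, and bias the DE mutant against it.

    ```python
    import numpy as np

    def sqg_direction(pop, fit, k=4, rng=None):
        """Stochastic quasi-gradient estimate from k random population pairs.

        Each pair (x_i, x_j) contributes a secant slope
        (f_i - f_j) * (x_i - x_j) / ||x_i - x_j||^2; averaging gives a
        cheap ascent-direction estimate with no analytical gradients and
        no additional objective evaluations.
        """
        rng = rng or np.random.default_rng()
        g = np.zeros(pop.shape[1])
        for _ in range(k):
            i, j = rng.choice(len(pop), 2, replace=False)
            d = pop[i] - pop[j]
            g += (fit[i] - fit[j]) * d / (np.dot(d, d) + 1e-12)
        return g / k

    def sqg_mutant(pop, fit, F=0.8, w=0.5, rng=None):
        """DE/rand/1 mutant nudged downhill along the estimated direction."""
        rng = rng or np.random.default_rng()
        a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
        return a + F * (b - c) - w * sqg_direction(pop, fit, rng=rng)
    ```

    On a linear objective each pairwise term is the projection of the true gradient onto the pair's difference vector, so the estimate correlates positively with the gradient while costing only arithmetic, which matches the abstract's claim of a computationally inexpensive, non-invasive add-on.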

    Selection models with monotone weight functions in meta analysis

    Publication bias, the fact that studies identified for inclusion in a meta analysis do not represent all studies on the topic of interest, is commonly recognized as a threat to the validity of the results of a meta analysis. One way to explicitly model publication bias is via selection models or weighted probability distributions. We adopt the nonparametric approach initially introduced by Dear (1992) but impose that the weight function w is monotonically non-increasing as a function of the p-value. Since in meta analysis one typically only has few studies or "observations", regularization of the estimation problem seems sensible. In addition, virtually all parametric weight functions proposed so far in the literature are in fact decreasing. We discuss how to estimate a decreasing weight function in the above model and illustrate the new methodology on two well-known examples. The new approach potentially offers more insight into the selection process than other methods and is more flexible than parametric approaches. Some basic properties of the log-likelihood function and computation of a p-value quantifying the evidence against the null hypothesis of a constant weight function are indicated. In addition, we provide an approximate selection-bias-adjusted profile likelihood confidence interval for the treatment effect. The corresponding software and the datasets used to illustrate it are provided as the R package selectMeta. This enables full reproducibility of the results in this paper. Comment: 15 pages, 2 figures. Some minor changes according to reviewer comments
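    The monotonicity constraint at the heart of this approach is classically enforced with the pool-adjacent-violators algorithm (PAVA). The sketch below is a generic least-squares PAVA projection onto non-increasing sequences, shown only to illustrate the constraint; the paper's actual estimator maximises a (regularized) nonparametric likelihood rather than a least-squares criterion:

    ```python
    def pava_decreasing(y):
        """Least-squares projection of y onto non-increasing sequences (PAVA)."""
        # Work on the reversed sequence so pooling enforces a non-decreasing
        # fit, then reverse the result back.
        y = list(reversed(y))
        out = []                                  # stack of [block mean, size]
        for v in y:
            out.append([v, 1])
            # merge adjacent blocks while the order constraint is violated
            while len(out) > 1 and out[-2][0] > out[-1][0]:
                m2, s2 = out.pop()
                m1, s1 = out.pop()
                out.append([(m1 * s1 + m2 * s2) / (s1 + s2), s1 + s2])
        fit = []
        for mean, size in out:
            fit.extend([mean] * size)
        return list(reversed(fit))

    # Raw weights that wiggle upward get pooled into a non-increasing fit
    print(pava_decreasing([1.0, 0.25, 0.75, 0.0]))   # [1.0, 0.5, 0.5, 0.0]
    ```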

    Training a Carbon-Nanotube/Liquid Crystal Data Classifier Using Evolutionary Algorithms

    Evolution-in-Materio uses evolutionary algorithms (EA) to exploit the physical properties of unconfigured, physically rich materials, in effect transforming them into information processors. The potential of this technique for machine learning problems is explored here. Results are obtained from a mixture of single-walled carbon nanotubes and liquid crystals (SWCNT/LC). The complex nature of the voltage/current relationship of this material presents a potential for adaptation. Here, it is used as a computational medium evolved by two derivative-free, population-based stochastic search algorithms, particle swarm optimisation (PSO) and differential evolution (DE). The computational problem considered is data classification. A custom-made electronic motherboard for interacting with the material has been developed, which allows the application of control signals on the material body. Starting with a simple binary classification problem of separable data, the material is trained with an error minimisation objective for both algorithms. Subsequently, the solution, defined as the combination of the material itself and optimal inputs, is verified and results are reported. The evolution process based on EAs has the capacity to evolve the material to a state where data classification can be performed. PSO outperforms DE in terms of the results’ reproducibility, due to the smoother, as opposed to noisier, inputs applied to the material.